Open Science.
Challenge or threat?

How researchers appraise reform messages in light of available resources



Jürgen Schneider
Mareike Kunter

29 October 2025

Development of Open Science



Transparency and openness endorsed by key stakeholders

Top-down policies, e.g., recommendations by GEBF & AEPF

“Researchers make materials on their data-analysis procedures available in a reproducible (where possible) and easily accessible form, in order to facilitate the traceability, replicability, and reuse of results.”


European Research Council
Grantees should describe the protocols used to structure their data and indicate the metadata standards applied. This will allow other scientists to make an assessment, to attempt to reproduce the conclusions derived from the dataset […] and potentially reuse the data for further research. (ERC, 2022, p. 5)

Theory



  • ~3.2% of researchers share their code (Krähmer et al., 2023)


  • Interindividual differences:
    • On the one hand: an increase in perceived importance and benefits, and a rise in open science practices (UNESCO, 2023)
    • On the other hand: open science policy alienation (Lilja, 2021)


  • CAMCC: interindividual differences in dealing with “reform messages,” viewed from an educational-psychology perspective (Gregoire, 2003)

Theory

CAMCC


  • “Reform messages” are appraised against the backdrop of available resources


  • as a (positive) challenge or as a threat








  • this appraisal then conditions how the topic is processed

Research questions

Regarding transparency in data analysis


  1. To what extent is the perception of available resources related to experiencing the demand as a threat?

  2. To what extent is the perception of available resources related to experiencing the demand as a challenge?

  3. Are threat appraisals and challenge appraisals related to cognitive engagement with the topic?

Method

Sample

N = 155 researchers (a priori power analysis: 90% power)

  • with a quantitative-empirical publication
  • in educational research
  • in 2024
  • as corresponding author

Age: M = 43.01 years (SD = 10.69)
40 countries represented; most frequent:

  • Germany (n = 19)
  • USA (n = 16)
  • Türkiye (n = 11)


Method



Consider the following scenario

Imagine you are applying for a grant from a research funder to conduct a study on the relationship between students’ self-regulated learning strategies and their academic performance. You plan to administer a survey with quantitative measures to a sample of university students, collecting data on their use of specific learning strategies, related academic outcomes, and demographic characteristics.

The funder’s information for grantees states that the data analyses should be computationally reproducible.
“Grantees should allow other scientists to make an assessment, to attempt to reproduce the conclusions derived from the dataset, and potentially reuse the data for further research.”

This means that you should document and provide the data as well as data analysis in such a way that other researchers can understand and follow the steps to obtain exactly the same results as you. Ideally, the other researcher will not have any additional costs (such as having to buy software) that exceed the internet costs of downloading your research materials.
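The scenario's core demand, that another researcher obtains “exactly the same results,” can be illustrated with a minimal, hypothetical analysis script. Fixing the random seed and recording the software environment are two basic ingredients of computational reproducibility; the function names and numbers below are illustrative and not taken from the study.

```python
import random
import sys
import platform

def run_analysis(seed: int = 2024) -> float:
    """Toy 'analysis': mean of simulated scores, with a fixed seed."""
    rng = random.Random(seed)  # seeded RNG -> identical numbers on every run
    scores = [rng.gauss(mu=50, sigma=10) for _ in range(100)]
    return sum(scores) / len(scores)

def environment_info() -> dict:
    """Record the environment so others can match the system requirements."""
    return {"python": sys.version.split()[0], "platform": platform.system()}

# Two independent runs yield exactly the same result:
assert run_analysis() == run_analysis()
```

Sharing such a script together with the data and the environment record is what lets a reviewer rerun the steps rather than trust the reported numbers.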

Method

Design


Procedure

  • Scenario (“reform message”)
  • Measures
    • Expectancy of Success (\(\omega\) = .81)
    • Loss of Valued Alternatives (\(\omega\) = .82)
    • Experience
    • Infrastructural Support (\(\omega\) = .91)
    • Challenge Appraisal (\(\omega\) = .86)
    • Threat Appraisal (\(\omega\) = .83)




Method

Design


Procedure

  • Scenario (“reform message”)
  • Measures (resources, appraisals)
  • Information text on reproducibility
  • Measures
    • Cognitive (Reading) Engagement (\(\omega\) = .70)
    • Surface Reading (\(\omega\) = .80)
    • (Practice-Oriented) Elaboration (\(\omega\) = .76)

Method



Statistical analyses

  • Multivariate Bayesian regression (brms)
  • Three outcomes (challenge appraisal, threat appraisal, engagement) modeled simultaneously
  • Tests of directional hypotheses (see preregistration)
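The model itself was fit with brms in R (not shown here). As a hedged, language-neutral sketch of what “standardized regression weights for several outcomes modeled simultaneously” means, the snippet below estimates one coefficient matrix for three outcomes from the same predictors, using ordinary least squares on z-standardized data instead of Bayesian estimation. All variable names and effect sizes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 155  # sample size as in the study

# Hypothetical predictors (resources): expectancy, costs, infrastructure, experience
X = rng.normal(size=(n, 4))

# Hypothetical true effects on three outcomes:
# challenge appraisal, threat appraisal, engagement
B_true = np.array([[ 0.4, -0.3,  0.2],
                   [-0.2,  0.4, -0.1],
                   [ 0.3, -0.2,  0.1],
                   [ 0.1, -0.1,  0.2]])
Y = X @ B_true + rng.normal(scale=1.0, size=(n, 3))

def standardize(a):
    """z-standardize each column so weights are comparable across scales."""
    return (a - a.mean(axis=0)) / a.std(axis=0)

# One coefficient matrix covers all three outcomes at once (multivariate OLS)
B_hat, *_ = np.linalg.lstsq(standardize(X), standardize(Y), rcond=None)
print(B_hat.round(2))  # rows: predictors, columns: outcomes
```

The Bayesian version replaces the point estimates with full posterior distributions, which is what allows the HDIs and Bayes factors reported below.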

Results

  • Color: preregistered direction of the association

  • Values: standardized regression weights
    • consistent with the hypothesis
    • contrary to the hypothesis

  • Median and 95% HDI
    • Threat appraisal: Md R² = 0.15, 95% HDI [.06; .24]
    • Challenge appraisal: Md R² = 0.11, 95% HDI [.04; .19]
    • Surface reading: Md R² = 0.10, 95% HDI [.04; .18]
    • Elaboration: Md R² = 0.26, 95% HDI [.16; .35]
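The 95% HDI reported above is the shortest interval that contains 95% of the posterior draws. A minimal sketch of how such an interval is found from a vector of draws (the draws below are hypothetical, not the study's posterior):

```python
import numpy as np

def hdi(samples, mass=0.95):
    """Shortest interval containing `mass` of the draws (assumes unimodality)."""
    s = np.sort(np.asarray(samples))
    k = int(np.ceil(mass * len(s)))           # number of draws inside the interval
    widths = s[k - 1:] - s[: len(s) - k + 1]  # width of every candidate interval
    i = int(np.argmin(widths))                # the shortest candidate wins
    return float(s[i]), float(s[i + k - 1])

# Hypothetical posterior draws for an R-squared parameter (mean around .15)
draws = np.random.default_rng(7).beta(a=6, b=34, size=4000)
lo, hi = hdi(draws)
```

Unlike an equal-tailed credible interval, the HDI hugs the region of highest density, which matters for skewed posteriors such as those of R².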

Discussion


  • The interpretation of reform messages is related to resources (even though the explained variance could be larger)
  • Consistent with the CAMCC: expectancy of success, opportunity costs, and infrastructural support act as predictors
  • Positive feelings (challenge) predict favorable processing
  • Negative feelings (threat) predict surface-level processing



Limitations

  • Unclear why infrastructural support should lead to threat appraisal
  • Unclear why opportunity costs should lead to challenge appraisal
  • The operationalization of experience captures hardly any interindividual differences

Thank you



Jürgen Schneider

References

Borycz, J., Olendorf, R., Specht, A., Grant, B., Crowston, K., Tenopir, C., Allard, S., Rice, N. M., Hu, R., & Sandusky, R. J. (2023). Perceived benefits of open data are improving but scientists still lack resources, skills, and rewards. Humanities and Social Sciences Communications, 10(1), 339. https://doi.org/10.1057/s41599-023-01831-7
Cao, H., Dodge, J., Lo, K., McFarland, D. A., & Wang, L. L. (2023). The rise of open science: Tracking the evolution and perceived value of data and methods link-sharing practices. https://doi.org/10.48550/ARXIV.2310.03193
DFG. (2015). Leitlinien zum Umgang mit Forschungsdaten.
DFG. (2019). Leitlinien zur Sicherung guter wissenschaftlicher Praxis. Kodex.
DGfE, GEBF, & GFD. (2020). Gemeinsame Stellungnahme der Deutschen Gesellschaft für Erziehungswissenschaft (DGfE), der Gesellschaft für Empirische Bildungsforschung (GEBF) und der Gesellschaft für Fachdidaktik (GFD) zur Archivierung, Bereitstellung und Nachnutzung von Forschungsdaten in den Erziehungs- und Bildungswissenschaften und Fachdidaktiken.
DGPs. (2021). Management und Bereitstellung von Forschungsdaten in der Psychologie: Überarbeitung der DGPs-Empfehlungen: DGPs-Kommission „Open Science“ (beschlossen durch den Vorstand der DGPs am 26.06.2020). Psychologische Rundschau, 72(2), 132–146. https://doi.org/10.1026/0033-3042/a000514
ERC. (2022). Open research data and data management plans. Information for ERC grantees.
ERC. (2023). Guidelines on implementation of open access to scientific publications and research data.
European Commission. (2016). Open innovation, open science, open to the world: A vision for Europe. Publications Office.
European Commission. (2017). H2020 programme. Guidelines to the rules on open access to scientific publications and open access to research data in Horizon 2020.
European Commission. (2018). Turning FAIR into reality: Final report and action plan from the European Commission expert group on FAIR data. Publications Office.
Ferguson, J., Littman, R., Christensen, G., Paluck, E. L., Swanson, N., Wang, Z., Miguel, E., Birke, D., & Pezzuto, J.-H. (2023). Survey of open science practices and attitudes in the social sciences. Nature Communications, 14(1), 5401. https://doi.org/10.1038/s41467-023-41111-1
Gregoire, M. (2003). Is it a challenge or a threat? A dual-process model of teachers’ cognition and appraisal processes during conceptual change. Educational Psychology Review, 15(2), 147–179. https://doi.org/10.1023/A:1023477131081
Krähmer, D., Schächtele, L., & Schneck, A. (2023). Care to share? Experimental evidence on code sharing behavior in the social sciences. PLOS ONE, 18(8), e0289380. https://doi.org/10.1371/journal.pone.0289380
Lilja, E. (2021). Threat of policy alienation: Exploring the implementation of open science policy in research practice. Science and Public Policy, 47(6), 803–817. https://doi.org/10.1093/scipol/scaa044
UNESCO. (2022). UNESCO recommendation on open science.
UNESCO. (2023). Open science outlook 1: Status and trends around the world. UNESCO. https://doi.org/10.54677/GIIC6829

Credit

Title photo by Jukan Tateisi on Unsplash

Icons by Font Awesome CC BY 4.0

————

Method

Sample

  • Recruitment: corresponding authors of quantitative-empirical publications (2024) in the field of education, drawn from Web of Science
    • Initial dataset: N = 48,484 publications
    • After filtering: N = 10,960 quantitative-empirical studies
    • Random sample in three batches (1,550, 3,100, and 3,100)
  • Final sample: N = 155 researchers (a priori power analysis: 90% power)
  • Gender: 73 female, 81 male, 1 non-binary
  • Age: M = 43.01 years (SD = 10.69)
  • Career stage:
    • Doctoral researchers: n = 28 (18.1%)
    • Postdocs: n = 21 (13.5%)
    • Professors (junior/mid/senior): n = 95 (61.3%)
    • Other (research staff/non-faculty): n = 11 (7.1%)
  • Countries: 40 countries represented; most frequent: Germany (n = 19), USA (n = 16), Türkiye (n = 11)
  • Incentive: €8.50 donated per participation to one of four charities
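The a priori power analysis itself is documented in the preregistration. As a hedged illustration of the general idea, one common approach is simulation-based: generate many datasets under an assumed effect size and count how often the effect is detected. The sample size, effect size, and test below are hypothetical and not the study's preregistered analysis.

```python
import numpy as np
from statistics import NormalDist

def simulated_power(n=155, beta=0.25, n_sims=2000, alpha=0.05, seed=3):
    """Share of simulated studies that detect a true standardized effect `beta`."""
    rng = np.random.default_rng(seed)
    crit = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided z criterion
    hits = 0
    for _ in range(n_sims):
        x = rng.normal(size=n)                   # standardized predictor
        y = beta * x + rng.normal(size=n)        # outcome with unit residual noise
        b = np.sum(x * y) / np.sum(x * x)        # OLS slope through the origin
        se = np.sqrt(np.sum((y - b * x) ** 2) / (n - 1) / np.sum(x * x))
        hits += abs(b / se) > crit
    return hits / n_sims
```

Running the function with beta = 0 recovers roughly the alpha level, which is a quick sanity check that the simulation is calibrated.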

Method

Domain, scale, sample item, and response format:

  • RESOURCES: MOTIVATION
    • EVM: Expectancy of success. “I am usually sure to achieve computationally reproducible results when I try hard.” (6-point Likert scale)
    • EVM: Loss of valued alternatives. “I have to sacrifice too much to ensure computational reproducibility in my work.” (6-point Likert scale)
  • RESOURCES: ABILITY
    • Experience. “How often have you documented and provided the data as well as data analysis in such a way that other researchers can understand and follow the steps to obtain exactly the same results as you?” (numeric)
    • Infrastructural resources. “My institution regularly offers training opportunities related to computational reproducibility.” (6-point Likert scale)
  • APPRAISAL
    • Challenge appraisal. “I would like to learn something new about computational reproducibility.” (6-point Likert scale)
    • Threat appraisal. “I'm worried about whether I will understand the content of the text.” (6-point Likert scale)
  • PROCESSING
    • Cognitive engagement. “I must confess that I did not read some passages with great concentration.” (6-point Likert scale)
    • Behavioral engagement. “Would you like to download the information text and further practical suggestions on how you can improve the computational reproducibility of your data analyses?” (Yes/No)
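The ω values reported for the scales are McDonald's omega. For a unidimensional scale with standardized items, the coefficient follows directly from the factor loadings, as the sketch below shows with hypothetical loadings (not the study's actual loadings).

```python
def mcdonalds_omega(loadings):
    """McDonald's omega for a unidimensional scale with standardized items."""
    lam = sum(loadings)                       # sum of factor loadings
    err = sum(1 - l ** 2 for l in loadings)   # unique variances of the items
    return lam ** 2 / (lam ** 2 + err)

# Hypothetical loadings of a four-item scale
omega = mcdonalds_omega([0.7, 0.75, 0.8, 0.65])  # ≈ 0.82
```

Unlike Cronbach's alpha, omega does not assume equal loadings across items, which is why it is the more common reliability estimate for such scales.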


Theory

  • Research is constantly evolving; researchers repeatedly have to engage with new demands

  • Reform messages by the DFG, … + the excerpt used here illustrate this

  • This calls established practices into question (~3.2% of researchers share their code, Krähmer et al., 2023)

  • Yet engagement with these messages turns out to be low

  • A lack of resources is cited as one of the central reasons for missing implementation

-> Introduce Gregoire: these resources determine whether the reform message is perceived as a challenge or a threat, and hence whether researchers engage with its content -> reform fatigue vs. successful implementation

Development of Open Science




Transparency and openness endorsed by key players

  • DFG (2015, 2019)
  • ERC (2022, 2023)
  • scientific societies (e.g., DGPs, 2021)
  • UNESCO (2022)

Engagement with open science policies

  • Risk of policy alienation (Lilja, 2021)
  • A central barrier to policy compliance is a lack of resources (lack of clarity on how to comply, lack of training) (Dumanis et al., 2023)

Met with an increase in the

  • perceived importance and benefits (Borycz et al., 2023; Ferguson et al., 2023)
  • implementation of open research practices (Cao et al., 2023; UNESCO, 2023)

Development of Open Science



At the same time: many researchers struggle.

Comparatively low rate of

  • open data (2014-2017: 1%, 2018: 0.32%, 2020: 7.16%)
  • data analysis scripts (2014-2017: 1%)

(Hardwicke et al., 2022; Huff & Bongartz, 2023)

Because researchers lack resources such as

  • adequate training
  • designated project funding
  • infrastructure for data openness

(European Commission, 2023; Goodey et al., 2022; Houtkoop et al., 2018)

CAMCC model




Model for

  • predicting processing and conceptual change
  • (among other things) based on available resources.

(Gregoire, 2003)

CAMCC model




Focusing on

  • Resources
  • Challenge vs. threat appraisal
  • Processing depth

(Gregoire, 2003)

Study

Methods



  • Observational (survey) study (power analysis: N=120 researchers)
  • Procedure
    • vignette about the need for reproducible data analysis
    • measures (resources, appraisal)
    • information text on how to make data analyses reproducible
    • measures on cognitive and behavioral engagement

Access the survey via this link.




Study

Vignette (option 1)

Think of a current research project in which you are collecting and analyzing quantitative data. Now imagine that the research team assigns you the task of ensuring that your data and results are computationally reproducible.

This means that you have to provide the data and analysis code in such a way that another researcher can use it and produce exactly the same results as you. Ideally, the other researcher will not have any additional costs (such as having to buy software) that exceed the internet costs of downloading your research materials. Ideally, you should also take into account that your analyses run on different system requirements (e.g., Windows, Mac) and software versions (e.g., older versions).

Study

Vignette (option 2)

adapted from ERC (2022) “Open Research Data and Data Management Plans. Information for ERC grantees” (V4.1)

Imagine you are planning to submit a study to a research funder on the relationship between students’ self-directed learning strategies and their academic performance. A survey with quantitative measures will be administered to a sample of university students, collecting data on their use of specific learning strategies, corresponding academic outcomes and demographic variables.

The information for grantees from the research funder states that the data analyses should be computationally reproducible.
“Grantees should allow other scientists to make an assessment, to attempt to reproduce the conclusions derived from the dataset, and potentially reuse the data for further research.”

This means that you should provide the data and analysis code in such a way that other researchers can use it and produce exactly the same results as you. [Ideally, the other researcher will not have any additional costs (such as having to buy software) that exceed the internet costs of downloading your research materials.] [Ideally, you should also take into account that your analyses run on different system requirements (e.g., Windows, Mac) and software versions (e.g., older versions)].


The sentences in [square brackets] will be added cumulatively in two further conditions.

Results

Hypothesis 1

Researchers differ in whether they perceive the demand for transparency in data analysis as a threat or a challenge.

Challenge

Threat

Hypothesis 2a

Higher levels of personal resources (e.g., stronger expectancy of success, lower perceived loss of valued alternatives), infrastructural support, and prior experience are associated with lower threat appraisals regarding transparency in data analysis.

Expectancy of success

Loss of valued alternatives

Infrastructural support

Experience

Hypothesis 2b

Higher levels of personal resources (e.g., stronger expectancy of success, lower perceived loss of valued alternatives), infrastructural support, and prior experience are associated with stronger challenge appraisals regarding transparency in data analysis.

Expectancy of success

Loss of valued alternatives

Infrastructural support

Experience

Hypothesis 3a

Stronger threat appraisals of transparency in data analysis are associated with lower cognitive and behavioral engagement with the topic.

Cognitive engagement

Behavioral engagement

Hypothesis 3b

Stronger challenge appraisals of transparency in data analysis are associated with higher cognitive and behavioral engagement with the topic.

Cognitive engagement

Behavioral engagement

Results

  • Color: preregistered direction of the association

  • Values: standardized regression weights
    • consistent with the hypothesis
    • contrary to the hypothesis

  • All paths BF > 5 (preregistered inference threshold)

  • Median and 95% HDI
    • Challenge appraisal: Md R² = 0.11, 95% HDI [.036; .19]
    • Threat appraisal: Md R² = 0.15, 95% HDI [.06; .24]
    • Reading engagement: Md R² = 0.19, 95% HDI [.10; .28]